
    Integration by parts on the law of the reflecting Brownian motion

    We prove an integration by parts formula on the law of the reflecting Brownian motion $X := |B|$ in the positive half-line, where $B$ is a standard Brownian motion. In other words, we consider a perturbation of $X$ of the form $X^\epsilon = X + \epsilon h$, with $h$ a smooth deterministic function and $\epsilon > 0$, and we differentiate the law of $X^\epsilon$ at $\epsilon = 0$. This infinitesimal perturbation drastically changes the set of zeros of $X$ for any $\epsilon > 0$. As a consequence, the formula we obtain contains an infinite-dimensional generalized functional in the sense of Schwartz, defined in terms of Hida's renormalization of the squared derivative of $B$ and of the local time of $X$ at $0$. We also compute the divergence on the Wiener space of a class of vector fields not taking values in the Cameron-Martin space. Comment: 32 pages
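    The perturbation described above is easy to visualize numerically. The sketch below simulates a discretized reflecting Brownian motion $X = |B|$ and the perturbed path $X^\epsilon = X + \epsilon h$; the choice $h(t) = t$ is a hypothetical example of a smooth deterministic function (any smooth $h > 0$ on $(0, T]$ gives the same qualitative picture), not one taken from the paper. With such an $h$, $X^\epsilon$ is strictly positive away from $t = 0$, illustrating how the perturbation destroys the zero set of $X$.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 1.0, 10_000
dt = T / n
t = np.linspace(0.0, T, n + 1)

# Standard Brownian motion B via cumulative Gaussian increments,
# and the reflecting Brownian motion X = |B|.
B = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))])
X = np.abs(B)

# Perturbed path X^eps = X + eps*h with the (hypothetical) choice h(t) = t.
eps = 0.01
h = t
X_eps = X + eps * h

# X repeatedly returns to 0, while X^eps > 0 for all t > 0:
# the zero set of X is destroyed by an arbitrarily small eps > 0.
```

This is only an illustration of the setup; the paper's actual content is the integration by parts formula obtained by differentiating the law of $X^\epsilon$ at $\epsilon = 0$.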

    Fluctuations for a conservative interface model on a wall

    We consider an effective interface model on a hard wall in (1+1) dimensions, with conservation of the area between the interface and the wall. We prove that the equilibrium fluctuations of the height variable converge in law to the solution of an SPDE with reflection and conservation of the space average. The proof is based on recent results, obtained with L. Ambrosio and G. Savaré, on stability properties of Markov processes with log-concave invariant measures.

    Approximate maximizers of intricacy functionals

    G. Edelman, O. Sporns, and G. Tononi introduced in theoretical biology the neural complexity of a family of random variables. This functional is a special case of intricacy, i.e., an average of the mutual information of subsystems whose weights have good mathematical properties. Moreover, its maximum value grows at a definite speed with the size of the system. In this work, we compute this speed of growth exactly by building "approximate maximizers" subject to an entropy condition. These approximate maximizers work simultaneously for all intricacies. We also establish some properties of arbitrary approximate maximizers, in particular the existence of a threshold in the size of subsystems of approximate maximizers: most smaller subsystems are almost equidistributed, while most larger subsystems determine the full system. The main ideas are a random construction of almost maximizers with high statistical symmetry and the consideration of entropy profiles, i.e., the average entropies of subsystems of a given size. The latter gives rise to interesting questions of probability and information theory.

    Conservative stochastic Cahn--Hilliard equation with reflection

    We consider a stochastic partial differential equation with reflection at 0 and with the constraint of conservation of the space average. The equation is driven by the derivative in space of a space-time white noise and contains a double Laplacian in the drift. Due to the lack of a maximum principle for the double Laplacian, the standard techniques based on the penalization method do not yield existence of a solution. We propose a method based on infinite-dimensional integration by parts formulae, obtaining existence and uniqueness of a strong solution for all continuous nonnegative initial conditions, as well as detailed information on the associated invariant measure and Dirichlet form. Comment: Published in the Annals of Probability (http://www.imstat.org/aop/) by the Institute of Mathematical Statistics (http://www.imstat.org); DOI: http://dx.doi.org/10.1214/009117906000000773

    A probabilistic study of neural complexity

    G. Edelman, O. Sporns, and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely exchangeability and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies when the size of the system goes to infinity. For systems of a fixed size, we show that maximizers have small support and that exchangeable systems have small intricacy. In particular, maximizing intricacy leads to spontaneous symmetry breaking and failure of uniqueness. Comment: minor edits
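    An intricacy of the kind described above can be computed directly for a small discrete system. The sketch below averages the mutual information $I(X_S; X_{S^c})$ over all nonempty proper subsets $S$ with uniform weights; uniform weighting is one admissible choice used here for illustration (the paper classifies all admissible weight systems), and the code is a minimal sketch, not the authors' construction.

```python
import itertools

import numpy as np


def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))


def marginal(joint, subset):
    """Marginal of a joint array (shape (2,)*n) over the variables in `subset`."""
    axes = tuple(i for i in range(joint.ndim) if i not in subset)
    return joint.sum(axis=axes)


def mutual_info(joint, subset):
    """Mutual information I(X_S; X_{S^c}) between a subsystem and its complement."""
    comp = tuple(i for i in range(joint.ndim) if i not in subset)
    return (entropy(marginal(joint, subset).ravel())
            + entropy(marginal(joint, comp).ravel())
            - entropy(joint.ravel()))


def intricacy(joint):
    """Uniform-weight intricacy: average MI over all nonempty proper subsets."""
    n = joint.ndim
    vals = [mutual_info(joint, S)
            for k in range(1, n)
            for S in itertools.combinations(range(n), k)]
    return sum(vals) / len(vals)
```

For three independent uniform bits every split carries zero mutual information, so the intricacy is 0; for three perfectly correlated bits (mass 1/2 on each of 000 and 111) every split carries 1 bit, so the intricacy is 1.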

    A Renewal version of the Sanov theorem

    Large deviations for the local time of a process $X_t$ are investigated, where $X_t = x_i$ for $t \in [S_{i-1}, S_i[$, the $(x_j)$ are i.i.d. random variables on a Polish space, and $S_j$ is the $j$-th arrival time of a renewal process depending on $(x_j)$. No moment conditions are assumed on the arrival times of the renewal process. Comment: 13 pages